R-THEORY FOR MARKOV CHAINS

Author

  • RICHARD L. TWEEDIE
Abstract

If {X_n} is a discrete-time φ-irreducible Markov chain on a measure space (𝒳, ℱ) [4; p. 4], with n-step transition probabilities P^n(x, A), it has been shown in [5] that there exists a subset 𝒞_R of ℱ with the property that, for every A ∈ 𝒞_R and φ-almost all x ∈ 𝒳, the power series Σ_n P^n(x, A) z^n have the same radius of convergence R. Moreover, there is a countable partition of 𝒳 all of whose elements belong to 𝒞_R. If all the power series diverge for z = R, and {X_n} is aperiodic, then there is a second subset 𝒞_L of ℱ such that for any A ∈ 𝒞_L,

  lim_{n→∞} P^n(x, A) R^n = π(x, A) < ∞

exists for almost all x ∈ 𝒳. The state space 𝒳 can again be countably partitioned into elements of 𝒞_L. In this paper we assume that a topology exists on 𝒳, and investigate continuity conditions on the transition probabilities of {X_n} which will ensure that compact elements of ℱ lie in either 𝒞_R or 𝒞_L. A condition sufficient for both these desirable attributes is given in §3, and in §4 we consider weakening this condition. Examples are given to show that under the weaker condition compact sets may or may not belong to 𝒞_R or 𝒞_L, and some auxiliary conditions on 𝒳 are found which make the weaker continuity conditions sufficient for compact sets to belong to 𝒞_R and 𝒞_L.
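A finite-state illustration of the common convergence radius (not from the paper itself): for a strictly substochastic, irreducible kernel P on a finite state space, every power series Σ_n P^n(x, A) z^n has the same radius of convergence R = 1/ρ(P), where ρ(P) is the Perron root. The 3×3 matrix below is invented for this sketch:

```python
import numpy as np

# Hypothetical 3-state strictly substochastic kernel (mass leaks out
# each step), irreducible since all entries are positive.
P = np.array([[0.2, 0.3, 0.2],
              [0.1, 0.4, 0.3],
              [0.3, 0.2, 0.2]])

rho = max(abs(np.linalg.eigvals(P)))   # Perron root, here < 1
R = 1.0 / rho                          # common convergence radius

# Empirically: P^n(x, A)^(1/n) -> rho for every state x and set A,
# so each series sum_n P^n(x, A) z^n has the same radius R.
Pn = np.linalg.matrix_power(P, 200)
for x in range(3):
    mass_A = Pn[x, :2].sum()           # take A = {0, 1}
    print(x, mass_A ** (1 / 200))      # each value is close to rho
```

Since all states communicate, the geometric decay rate of P^n(x, A) is independent of x and A, which is the finite-state shadow of the 𝒞_R phenomenon described in the abstract.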


Similar articles

The Rate of Rényi Entropy for Irreducible Markov Chains

In this paper, we obtain the Rényi entropy rate for irreducible aperiodic Markov chains with countable state space, using the theory of countable nonnegative matrices. We also obtain a bound for the Rényi entropy rate of an irreducible Markov chain. Finally, we show that the bound for the Rényi entropy rate is the Shannon entropy rate.
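In the finite-state case the Rényi entropy rate has a known closed form: for order α ≠ 1 it equals (1/(1−α)) log λ_α, where λ_α is the Perron root of the matrix with entries p_ij^α, and it tends to the Shannon rate as α → 1. The sketch below checks this numerically on an invented two-state chain (the matrix and tolerances are assumptions, not taken from the paper):

```python
import numpy as np

def renyi_rate(P, alpha):
    """Renyi entropy rate (order alpha != 1, in nats) of an irreducible
    aperiodic chain with transition matrix P, via the Perron root of the
    elementwise power matrix with entries p_ij ** alpha."""
    lam = max(np.linalg.eigvals(P ** alpha).real)
    return np.log(lam) / (1.0 - alpha)

def shannon_rate(P):
    """Shannon entropy rate (nats): -sum_i pi_i sum_j p_ij log p_ij."""
    w, v = np.linalg.eig(P.T)
    pi = v[:, np.argmax(w.real)].real   # stationary distribution
    pi = pi / pi.sum()
    plogp = np.where(P > 0, P * np.log(np.where(P > 0, P, 1.0)), 0.0)
    return -(pi @ plogp.sum(axis=1))

# Invented two-state chain for the check.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
for a in (0.5, 0.9, 0.99, 1.01, 1.5):
    print("alpha =", a, "rate =", renyi_rate(P, a))
print("Shannon rate:", shannon_rate(P))  # the alpha -> 1 limit
```

Evaluating `renyi_rate` at α close to 1 reproduces the Shannon rate, the limiting bound described in the abstract.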


Probabilistic Sufficiency and Algorithmic Sufficiency from the point of view of Information Theory

Given the importance of Markov chains in information theory, the conditional probability for these random processes can also be expressed in terms of mutual information. In this paper, we study the relationship between the concept of sufficiency and Markov chains from the perspective of information theory, and the relationship between probabilistic sufficiency and algorithmic sufficien...


Empirical Bayes Estimation in Nonstationary Markov chains

Estimation procedures for nonstationary Markov chains appear to be relatively sparse. This work introduces empirical Bayes estimators for the transition probability matrix of a finite nonstationary Markov chain. The data are assumed to be of a panel study type in which each data set consists of a sequence of observations on N ≥ 2 independent and identically dis...


Evaluation of First and Second Markov Chains Sensitivity and Specificity as Statistical Approach for Prediction of Sequences of Genes in Virus Double Strand DNA Genomes

The growing amount of information on biological sequences has made the application of statistical approaches necessary for modeling and estimating their functions. In this paper, the sensitivity and specificity of first- and second-order Markov chains for the prediction of genes were evaluated using complete double-stranded DNA virus genomes. There were two approaches for prediction of each Markov model parameter,...
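The abstract does not give the authors' exact procedure, so the following is only a hedged sketch of the general idea: score sequence windows under competing first-order Markov models (one trained on "gene-like" material, one on background) and tally the sensitivity and specificity of the resulting classifier. All matrices and sequences here are synthetic inventions:

```python
import numpy as np

rng = np.random.default_rng(0)
BASES = "ACGT"
IDX = {b: i for i, b in enumerate(BASES)}

def sample(P, n):
    """Draw a length-n sequence from a first-order chain with matrix P."""
    x, out = rng.integers(4), []
    for _ in range(n):
        out.append(BASES[x])
        x = rng.choice(4, p=P[x])
    return "".join(out)

def train(seqs):
    """First-order transition estimates with add-one smoothing."""
    c = np.ones((4, 4))
    for s in seqs:
        for a, b in zip(s, s[1:]):
            c[IDX[a], IDX[b]] += 1
    return c / c.sum(axis=1, keepdims=True)

def loglik(s, P):
    return sum(np.log(P[IDX[a], IDX[b]]) for a, b in zip(s, s[1:]))

# Invented generating chains: a GC-rich "gene" model vs uniform background.
P_gene = np.array([[0.1, 0.4, 0.4, 0.1]] * 4)
P_back = np.full((4, 4), 0.25)

Mg = train([sample(P_gene, 500) for _ in range(20)])
Mb = train([sample(P_back, 500) for _ in range(20)])

# Classify held-out windows by log-likelihood ratio.
tests = [(sample(P_gene, 200), 1) for _ in range(100)] + \
        [(sample(P_back, 200), 0) for _ in range(100)]
tp = sum(1 for s, y in tests if y == 1 and loglik(s, Mg) > loglik(s, Mb))
tn = sum(1 for s, y in tests if y == 0 and loglik(s, Mg) <= loglik(s, Mb))
sensitivity = tp / 100   # true positives / all gene windows
specificity = tn / 100   # true negatives / all background windows
print(sensitivity, specificity)
```

A second-order variant would replace the 4×4 transition table with a 16×4 table indexed by dinucleotide context; the scoring and the confusion-count bookkeeping stay the same.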


Markov Chains and Applications

In this paper I provide a quick overview of stochastic processes and then delve into a discussion of Markov chains. There is some assumed knowledge of basic calculus, probability, and matrix theory. I build up Markov chain theory towards a limit theorem. I prove the Fundamental Theorem of Markov Chains, relating the stationary distribution to the limiting distribution. I then employ this...
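The limit theorem mentioned above can be checked numerically: for an irreducible aperiodic chain, the stationary distribution (the left eigenvector of P for eigenvalue 1) matches every row of P^n for large n. The small chain below is invented for the check:

```python
import numpy as np

# A small irreducible, aperiodic chain (invented for illustration).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

# Stationary distribution: solve pi P = pi with sum(pi) = 1,
# i.e. the left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = v[:, np.argmin(abs(w - 1))].real
pi = pi / pi.sum()

# Fundamental theorem: every row of P^n converges to pi.
Pn = np.linalg.matrix_power(P, 50)
print("stationary:", pi)
print("rows of P^50:\n", Pn)
```

At n = 50 the three rows already agree with the stationary vector to many decimal places, since convergence is geometric at the rate of the subdominant eigenvalue.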


Taylor Expansion for the Entropy Rate of Hidden Markov Chains

We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, obtaining the exact value of the entropy rate is an open problem. We introduce some probability matrices based on the Markov chain's and the channel's parameters. Then, we try to obtain an estimate ...
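Since exact computation is open in general, one standard numerical handle (a sketch, not the authors' method) is the non-increasing sequence of conditional block entropies H(Y_n | Y_1 … Y_{n−1}), which upper-bounds the entropy rate and can be computed exactly for small n by enumerating outputs with the forward recursion. The parameters below are invented:

```python
import itertools
import numpy as np

def hmm_entropy_bound(p, eps, n):
    """H(Y_n | Y_1..Y_{n-1}) = H(Y_1..Y_n) - H(Y_1..Y_{n-1}), in bits,
    for a symmetric binary Markov source (flip prob p) observed through
    a binary symmetric channel with crossover eps."""
    A = np.array([[1 - p, p], [p, 1 - p]])          # hidden transitions
    B = np.array([[1 - eps, eps], [eps, 1 - eps]])  # emission probabilities

    def block_entropy(m):
        H = 0.0
        for y in itertools.product((0, 1), repeat=m):
            f = 0.5 * B[:, y[0]]          # forward vector P(y_1, X_1 = x)
            for t in range(1, m):
                f = (f @ A) * B[:, y[t]]  # fold in transition + emission
            prob = f.sum()                # P(y_1, ..., y_m)
            H -= prob * np.log2(prob)
        return H

    return block_entropy(n) - block_entropy(n - 1)

vals = [hmm_entropy_bound(0.3, 0.1, n) for n in (2, 4, 6, 8)]
for n, v in zip((2, 4, 6, 8), vals):
    print(n, v)
```

Because the process is stationary, these conditional entropies decrease monotonically toward the true entropy rate; the cost is exponential in n, which is why closed-form or matrix-based estimates like those in the abstract are of interest.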



Journal:

Volume   Issue

Pages  -

Publication date: 2006